

Search for: All records

Creators/Authors contains: "Laefer, Debra F"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).


  1. Introduction: Without community-based, data-aggregation tools, timely and meaningful local input into brownfield management is not tenable, as relevant data are dispersed and often incomplete. In response, this project lays the groundwork through which constructive dialogue between community members and local officials can be facilitated. Materials and methods: A Brownfield Engagement Tool (BET) is envisioned as a means by which non-experts can use disparately held open data streams to collect, analyse, and visualise brownfield site data, better understand aggregate health risks, and provide direct input into remediation and redevelopment decisions. By raising awareness and providing knowledge about brownfield-related issues, the BET is intended to encourage community member participation in public debate. This concept is demonstrated for a 113-hectare Brooklyn, New York neighbourhood with a long history of industrial and mixed-use development resulting in 18 brownfields. The proposed remediation prioritization strategy offers a systematic analysis of the sites’ size, contaminants, proximity to gathering spots, and demographics. Results: The BET proposed in this paper offers a novel approach for community-based management of brownfields, done at the census tract level and based on the factors that most affect the local community. By combining publicly available municipal, state, and federal data in the BET, a set of easy-to-understand metrics can be generated through which a community can compare and rank existing brownfields to prioritize future interventions; these metrics can also serve as a support system for raising funding and investment to address neighbourhood issues. This type of approach is the first of its kind with respect to brownfield redevelopment.
    Free, publicly-accessible full text available December 31, 2026
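The ranking idea in the abstract above — scoring sites on size, contaminants, and proximity to gathering spots, then comparing them — can be sketched in a few lines. This is an illustrative sketch only, not the published BET implementation; all field names, weights, and site values below are hypothetical.

```python
# Illustrative sketch (not the published BET): rank brownfield sites by a
# weighted score over normalized versions of the factors the abstract
# names. Weights and site data are invented for demonstration.

def normalize(values):
    """Scale a list of non-negative numbers to the range [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(v - lo) / span for v in values]

def rank_sites(sites, weights=(0.4, 0.4, 0.2)):
    """Return site names ordered from highest to lowest priority.

    Each site is (name, size_ha, n_contaminants, metres_to_gathering_spot).
    Larger size and more contaminants raise priority; greater distance
    from gathering spots lowers it.
    """
    names = [s[0] for s in sites]
    size = normalize([s[1] for s in sites])
    contam = normalize([s[2] for s in sites])
    # Invert normalized distance so that closer sites score higher.
    prox = [1.0 - d for d in normalize([s[3] for s in sites])]
    w_size, w_contam, w_prox = weights
    scores = [w_size * a + w_contam * b + w_prox * c
              for a, b, c in zip(size, contam, prox)]
    return [n for _, n in sorted(zip(scores, names), reverse=True)]

sites = [
    ("Site A", 2.0, 5, 100.0),   # large, many contaminants, near a park
    ("Site B", 0.5, 1, 900.0),   # small, few contaminants, remote
    ("Site C", 1.2, 3, 300.0),
]
print(rank_sites(sites))   # highest-priority site first
```

A community group could adjust the weights to reflect local concerns, e.g., up-weighting proximity for a neighbourhood with many schools and playgrounds.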
  2. Unlike aboveground utility systems, for which very detailed and accurate information exists, there is generally a dearth of good-quality data about underground utility infrastructures that provide vital services. To identify key strategies to improve the resilience of these underground systems, this paper presents mechanisms for successful engagement and collaboration among stakeholders and shared cross-sector system vulnerability concerns (including data availability) based on the innovative use of focus groups. Outputs from two virtual focus groups were used to obtain information from New York City area utilities and other stakeholders affected by underground infrastructure. There was strong agreement among participants that (1) a trusted agency in New York City government should manage a detailed map of underground infrastructure that would allow stakeholders to securely access appropriate information about underground systems on a need-to-know basis; (2) environmental risk factors, such as infrastructure age and condition, as well as location should be included; and (3) improved mechanisms for collaboration and sharing information are needed, especially during non-emergency situations. Stakeholders also highlighted the need for a regularly updated central database of relevant contacts at key organizations, since institutions often have a high employee turnover rate, which creates knowledge loss. The focus group script developed as part of this research was designed to be transferable to other cities to assess data needs and potential obstacles to stakeholder collaboration in the areas of underground infrastructure mapping and modeling.
    Free, publicly-accessible full text available March 1, 2026
  3. This paper proposes a flood risk visualization method that is (1) readily transferable, (2) hyperlocal, (3) computationally inexpensive, and (4) geometrically accurate. This proposal is for risk communication, to provide high-resolution, three-dimensional flood visualization at the sub-meter level. The method couples a laser scanning point cloud with algorithms that produce textured floodwaters, achieved through compounding multiple sine functions in a graphics shader. This hyper-local approach to visualization is enhanced by the ability to portray changes in (i) water color, (ii) texture, and (iii) motion (including dynamic heights) for various flood prediction scenarios. Through decoupling physics-based predictions from the visualization, a dynamic flood risk viewer was produced with modest processing resources involving only a single, quad-core processor with a frequency around 4.30 GHz and with no graphics card. The system offers several major advantages. (1) The approach enables its use on a browser or with inexpensive, virtual reality hardware and, thus, promotes local dissemination for flood risk communication, planning, and mitigation. (2) The approach can be used for any scenario where water interfaces with the built environment, including inside of pipes. (3) When tested for a coastal inundation scenario from a hurricane, 92% of the neighborhood participants found it to be more effective in communicating flood risk than traditional 2D mapping flood warnings provided by governmental authorities.
    Free, publicly-accessible full text available February 1, 2026
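The "compounding multiple sine functions" technique mentioned above is a standard way to fake a textured water surface cheaply. The sketch below illustrates the idea in plain Python rather than shader code; all amplitudes, frequencies, speeds, and directions are made-up values, not the paper's parameters.

```python
# Illustrative sum-of-sines water surface (not the paper's shader):
# several travelling sine waves with different amplitudes, spatial
# frequencies, speeds, and directions are summed to give a height
# field that looks textured and animates over time.
import math

# Each wave: (amplitude in m, frequency in 1/m, speed in rad/s, direction in rad)
WAVES = [
    (0.20, 0.8, 1.0, 0.0),   # long, slow swell
    (0.08, 2.1, 1.6, 0.7),   # medium chop at an angle
    (0.03, 5.3, 2.2, 1.9),   # fine surface ripple
]

def water_height(x, y, t, base_level=0.0):
    """Height of the water surface at plan position (x, y) and time t."""
    h = base_level
    for amp, freq, speed, theta in WAVES:
        # Project (x, y) onto the wave's direction of travel.
        d = x * math.cos(theta) + y * math.sin(theta)
        h += amp * math.sin(freq * d + speed * t)
    return h

# The surface stays within the summed amplitude bound (0.31 m here)
# around the base level, which can be set from a flood prediction.
samples = [water_height(3.0, 4.0, t * 0.5) for t in range(10)]
print(max(abs(s) for s in samples))
```

In a real shader the same sum would run per vertex or per pixel on the GPU; evaluating it on the CPU, as here, is consistent with the paper's point that the visualization needs no graphics card.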
  4. A utilidor is a ‘system of systems’ infrastructural solution to the ‘subsurface spaghetti’ problem resulting from direct burial of utility transmission infrastructure beneath the public right of way (PROW). The transition from direct burial to utilidors in older, dense American cities has generally not occurred, despite the potential to increase system performance in a long-term, financially and environmentally sustainable manner, because it would require reform of local planning practices and of utility pricing to support financing within a complex regulatory system. Utilidor adoption in New York City (NYC) would be a significant local infrastructure transition, amplifying the need for locality-based research, that would occur while each utility sector undergoes its own infrastructure transitions, thereby increasing the level of regulatory complexity. This paper applies transitions analysis, recursive collective action theory, and capacity to act analysis to NYC’s experience with its PROW subsurface spaghetti problem and utilidor implementation to demonstrate a place-based methodology that identifies specific sources of resistance to innovative subsurface design and feasible pathways for resolving them. This methodology would be transferable for application to other American cities or classes of American cities to supplement the limits of generalised subsurface and subsurface resource integration research for practitioner application.
  5. Abstract Street view imagery databases such as Google Street View, Mapillary, and Karta View provide great spatial and temporal coverage for many cities globally. Those data, when coupled with appropriate computer vision algorithms, can provide an effective means to analyse aspects of the urban environment at scale. As an effort to enhance current practices in urban flood risk assessment, this project investigates a potential use of street view imagery data to identify building features that indicate buildings’ vulnerability to flooding (e.g., basements and semi-basements). In particular, this paper discusses (1) building features indicating the presence of basement structures, (2) available imagery data sources capturing those features, and (3) computer vision algorithms capable of automatically detecting the features of interest. The paper also reviews existing methods for reconstructing geometry representations of the extracted features from images and potential approaches to account for data quality issues. Preliminary experiments were conducted, which confirmed the usability of the freely available Mapillary images for detecting basement railings as an example type of basement feature, as well as for geolocating the detected features.
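The geolocation step mentioned at the end of the abstract above can be illustrated with simple geometry: given a street-view image's camera position and heading (the kind of metadata Mapillary-style imagery carries) plus an estimated distance to a detected feature, offset the camera position along the heading. This is a hedged sketch, not the paper's method; it uses a flat-earth approximation that is adequate at street scale, and all coordinates below are illustrative.

```python
# Hypothetical geolocation sketch: place a detected basement feature on
# the map from the camera's position, compass heading, and an estimated
# range to the detection. Flat-earth approximation; fine for tens of
# metres, not for long distances or near the poles.
import math

EARTH_RADIUS_M = 6_371_000.0

def locate_feature(cam_lat, cam_lon, heading_deg, distance_m):
    """Offset (cam_lat, cam_lon) by distance_m along heading_deg.

    Heading is degrees clockwise from north. Returns (lat, lon).
    """
    theta = math.radians(heading_deg)
    d_north = distance_m * math.cos(theta)
    d_east = distance_m * math.sin(theta)
    dlat = math.degrees(d_north / EARTH_RADIUS_M)
    dlon = math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(cam_lat))))
    return cam_lat + dlat, cam_lon + dlon

# A detection estimated ~8 m due east of a camera in Brooklyn:
lat, lon = locate_feature(40.6782, -73.9442, 90.0, 8.0)
print(round(lat, 6), round(lon, 6))
```

In practice the range estimate would come from the detection's image position and camera geometry (or from structure-from-motion), which is where the paper's discussion of data quality comes in.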
  6. Current state-of-the-art point cloud data management (PCDM) systems rely on a variety of parallel architectures and diverse data models. The main objective of these implementations is achieving higher scalability without compromising performance. This paper reviews the scalability and performance of state-of-the-art PCDM systems with respect to both parallel architectures and data models. More specifically, in terms of parallel architectures, shared-memory, shared-disk, and shared-nothing architectures are considered. In terms of data models, relational models, novel data models (such as wide-column models), and New Structured Query Language (NewSQL) models are considered. The impacts of parallel architectures and data models are discussed from theoretical perspectives and in the context of existing PCDM implementations. Based on the review, a methodical approach for the selection of parallel architectures and data models for highly scalable and performance-efficient PCDM system development is proposed. Finally, notable research gaps in the PCDM literature are presented as possible directions for future research.
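To make the architecture distinction above concrete, here is a purely illustrative rule-of-thumb selector over the three parallel architectures the review names. The thresholds and inputs are invented for demonstration and are not the paper's selection methodology.

```python
# Purely illustrative decision sketch (not the paper's methodology):
# a toy mapping from workload traits to the three parallel
# architectures discussed in the review. Thresholds are invented.

def suggest_architecture(points_billions, writers, scale_out_needed):
    """Pick among shared-memory, shared-disk, and shared-nothing."""
    if not scale_out_needed and points_billions < 1:
        # Dataset fits one large node: simplest option, no network hops.
        return "shared-memory"
    if writers <= 1:
        # Many readers, a single ingest process: nodes can share storage.
        return "shared-disk"
    # Many concurrent loaders at scale: partition data across nodes.
    return "shared-nothing"

print(suggest_architecture(0.2, 1, False))   # small point cloud
print(suggest_architecture(50, 8, True))     # city-scale, many loaders
```

A real selection, as the review argues, would also weigh the data model (relational, wide-column, NewSQL) and the query workload, not just ingest concurrency and volume.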
  8. Abstract An adaptive, adversarial methodology is developed for the optimal transport problem between two distributions $\mu$ and $\nu$, known only through a finite set of independent samples $(x_i)_{i=1..n}$ and $(y_j)_{j=1..m}$. The methodology automatically creates features that adapt to the data, thus avoiding reliance on a priori knowledge of the distributions underlying the data. Specifically, instead of a discrete point-by-point assignment, the new procedure seeks an optimal map $T(x)$ defined for all $x$, minimizing the Kullback–Leibler divergence between $(T(x_i))$ and the target $(y_j)$. The relative entropy is given a sample-based, variational characterization, thereby creating an adversarial setting: as one player seeks to push forward one distribution to the other, the second player develops features that focus on those areas where the two distributions fail to match. The procedure solves local problems that seek the optimal transfer between consecutive, intermediate distributions between $\mu$ and $\nu$. As a result, maps of arbitrary complexity can be built by composing the simple maps used for each local problem. Displaced interpolation is used to guarantee global from local optimality. The procedure is illustrated through synthetic examples in one and two dimensions.
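Two ideas from the abstract above can be demonstrated numerically in one dimension, under strong simplifying assumptions: (1) between equal-size 1D samples, the monotone (sort-to-sort) assignment is the optimal transport map, and (2) displaced interpolation produces intermediate distributions along the way, so a complex map can be reached by composing small steps. This sketch is not the paper's adversarial, feature-learning procedure; it only illustrates the transport-and-interpolation backbone.

```python
# 1D illustration (not the paper's adversarial method): quantile
# matching gives the monotone optimal transport map between samples,
# and displaced interpolation gives the intermediate distributions
# T_t(x) = (1 - t) * x + t * T(x) that the local problems step through.
import random

def monotone_map(xs, ys):
    """Optimal 1D transport between equal-size samples: i-th smallest
    x maps to i-th smallest y. Returns T(x_i) aligned with xs."""
    order_x = sorted(range(len(xs)), key=lambda i: xs[i])
    ys_sorted = sorted(ys)
    t = [0.0] * len(xs)
    for rank, i in enumerate(order_x):
        t[i] = ys_sorted[rank]
    return t

def displace(xs, txs, t):
    """Sample from the displacement interpolation at time t in [0, 1]."""
    return [(1 - t) * x + t * tx for x, tx in zip(xs, txs)]

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(1000)]   # source ~ N(0, 1)
ys = [random.gauss(5.0, 2.0) for _ in range(1000)]   # target ~ N(5, 2)

txs = monotone_map(xs, ys)
midpoint = displace(xs, txs, 0.5)
# The midpoint distribution sits between source and target: its mean
# lies near the average of the two sample means.
print(round(sum(midpoint) / len(midpoint), 2))
```

In higher dimensions no sorting trick exists, which is precisely where the abstract's adversarial, sample-based characterization of the relative entropy earns its keep.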